Nanotechnology and the Negotiation of Novelty

Author

Arne Hessenbruch

Abstract

Much of the hype around nanotechnology relies on the notion that it is novel and revolutionary. A large part of that in turn relies on the purportedly revolutionary ability of the Scanning Tunneling Microscope to manipulate individual atoms. However, novelty always involves a comparison of similarity and difference with what came before. Furthermore, the novelty of the STM was negotiated and renegotiated from the very beginning – just as the nature of nanotechnology continues to be negotiated to this day. The history of the STM sheds light on the role of promise and hype in science in general and directs our attention towards science in the public sphere.

1. The Etymology of Nano

In our culture, the negotiation of novelty is commonplace. Patenting, for example, is a process to decide what counts as novel. Innovations are compared against predecessors, and consequential decisions are made on the basis of similarity and difference. The same is true of the Nobel Prizes. One might even argue that all arguments can be recast as a negotiation of similarity and difference. Nanoscience and nanotechnology are often claimed to be novel and just as often claimed not to be. In this paper I want to outline the history of nano with respect to the ongoing negotiation of its novelty.

The Oxford English Dictionary is a useful first port of call for this kind of endeavor. The first recorded use of the word dates from 1974, in an obscure publication, the Proceedings of the International Conference of Production Engineers. The second recorded use is Eric Drexler's 1986 Engines of Creation, and that is of course the most important locus, because this book was widely read. After 1986, one can see the word spread to publications with large readerships: the New Scientist, the Times Higher Education Supplement, the Washington Post, the Sunday Times, and Nature.

Drexler's Engines of Creation is a tremendously successful book, written in an upbeat tone of voice and painting a rosy future of vast technological ability. Drexler argued that we can now build structures on the nanoscale, meaning that we can move and combine atoms and molecules as we do with LEGO™ blocks, as long as the resultant molecules are energetically stable. We can build molecules with functions similar to those of the DNA-RNA-protein system found in nature; that is to say, our new molecules may be engineered so as to be parts of a self-reproducing system. From this will flow new materials, new drugs, new information technologies, new human tissues, new just about everything. In the introduction to the book, Marvin Minsky, Professor at MIT (and so a credible individual in matters technical), emphasized that Drexler's vision was not fanciful but based on a thorough knowledge of the current science and technology.

The vision was compelling for two reasons. 1) The tool for moving individual atoms, the scanning tunneling microscope (STM), became well known at just this time – it received the Nobel Prize in the same year that Engines of Creation was published (1986). 2) The combination of molecular biology, the incipient human genome project, and the understanding of biochemical pathways made it seem feasible that an ensemble slightly different from the DNA-RNA-protein one could be produced and have the same kind of tremendous power as life.
Much of Drexler's book thus addresses the issue of figuring out what kinds of molecules we would want to assemble given our knowledge of molecular biology and biochemical pathways, and how to ensure that the research would be beneficial. At the very same time, in the mid-1980s, the field of artificial life came into being (Helmreich 2000; Fox-Keller 2002, pp. 269-76). Nano and A-life are natural bedfellows: one predicts new forms of life created in the laboratory, the other simulates new forms of life on the computer. Both make the creation of new forms of life in the laboratory seem less fanciful. So, much of the feasibility of the vision depended upon the feasibility of the STM's purported control over individual atoms and upon the feasibility of alternative forms of life. And the novelty of Drexler's vision traded upon the novelty of the STM and A-life. In this paper I will focus upon the novelty of the former. As ever, this was negotiated and renegotiated.

2. The Scanning Tunneling Microscope

What is an STM and how does it work? It was described in the following way in Scientific American (Figure 1).

Figure 1: Scientific American's depiction of the STM in 1985, cf. Binnig & Rohrer 1985, p. 53. Courtesy of Ian Worpole and Scientific American Magazine.

A very fine needle is brought very close to a sample surface, for example a crystal surface whose structure is to be examined. When the tip is very close, electrons may jump across the gap from sample to tip, especially if an electrical potential is applied (e.g. by connecting the tip to a battery and the sample to earth). The jump across the gap is explained within quantum mechanical theory by the phenomenon of tunneling: the electrons tunnel through the vacuum even though classical, non-quantum mechanical theory predicts that they do not have the energy to surmount the obstacle posed by the vacuum. The tunneling electrons amount to an electrical current that can be measured with great precision. Quantum theory predicts that the tunneling current is very sensitive to the distance between tip and sample, falling off exponentially as the gap widens. If one scans the tip across the surface, the distance between tip and sample will oscillate and so will the current. The correlation of tip position and current can thus be used to produce an image on the computer screen giving a rendition of the topography of the sample surface.

The STM was invented by Gerd Binnig and Heinrich Rohrer at IBM Zurich in 1981. Their very first paper was concerned with a tunneling microscope (Binnig et al 1982a). They argued that they were able to reduce the distance between probe and surface to the dimensions of a single atom. The proof of this lay in the measured tunneling current, which falls off exponentially with that distance – a dependence that is theoretically explainable only with the quantum mechanical notion of electrons tunneling across the vacuum between probe and surface. The main point here is that they relied on quantum mechanics. They themselves highlighted the fact of atomic resolution (Binnig et al 1982b, emphasis in original): "Surface microscopy using vacuum tunneling is demonstrated for the first time. Topographic pictures of surfaces on an atomic scale have been obtained."
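To make the imaging principle described above a little more concrete, here is a minimal numerical sketch (not from Binnig and Rohrer's papers) of a constant-height line scan. It assumes the standard textbook model in which the tunneling current decays exponentially with the tip-sample gap; the decay constant, corrugation amplitude, and all function names are illustrative assumptions rather than measured values.

```python
# A minimal sketch of the constant-height STM imaging principle:
# the tunneling current falls off roughly exponentially with the tip-sample
# gap, so scanning the tip at a fixed height over a corrugated surface turns
# topography into a current signal. All numbers here are illustrative.

import math

KAPPA = 10.0   # assumed decay constant in nm^-1 (order of magnitude for metals)
I0 = 1.0       # illustrative current normalization in nA


def tunneling_current(gap_nm: float) -> float:
    """Tunneling current for a given tip-sample gap (exponential model)."""
    return I0 * math.exp(-2.0 * KAPPA * gap_nm)


def surface_height(x_nm: float) -> float:
    """Toy 'atomic' corrugation: a 0.02 nm ripple with 0.3 nm spacing."""
    return 0.02 * math.sin(2.0 * math.pi * x_nm / 0.3)


def constant_height_scan(tip_height_nm: float = 0.5,
                         step_nm: float = 0.03,
                         n_points: int = 30):
    """Scan the tip laterally at fixed height and record current vs. position."""
    trace = []
    for i in range(n_points):
        x = i * step_nm
        gap = tip_height_nm - surface_height(x)  # gap narrows over a 'bump'
        trace.append((x, tunneling_current(gap)))
    return trace


if __name__ == "__main__":
    for x, current in constant_height_scan():
        # Higher current marks where the surface bulges toward the tip,
        # which is how the current map renders the surface topography.
        print(f"x = {x:5.2f} nm   I = {current:.3e} nA")
```

Running the sketch prints a current trace whose peaks line up with the bumps of the toy surface, which is the sense in which the recorded current "renders" the topography.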
3. The Holy Grail of Atomic Resolution

In order to understand this, let us examine the significance of the term "atomic resolution" for the audiences that Binnig and Rohrer addressed. With some hyperbole one might say that atomic resolution had been the holy grail of the natural sciences for at least a hundred years. Nineteenth-century scientists developed a language based on atoms as elementary building blocks, with which all sorts of analytical and industrial chemistry were carried out. The concepts of the atom and of Mendeleev's periodic table of the elements were tremendously useful. But it was agreed that there was no direct evidence of atoms, and many scientists developed a pragmatic attitude, dismissing all discussions of atoms as metaphysical – beyond measurement, beyond our ken (Nye 1984). In the early 20th century, much experimental evidence emerged with radioactivity and x-rays. The visible tracks made by alpha particles in cloud chambers were very powerful (Galison 1997), and atom-talk became kosher once more. William Henry Bragg, for instance, spent much of his career popularizing such talk, lecturing on BBC radio and at the Royal Institution on individual particles flying through a gas (Bragg 1925). He also spent much time developing x-rays as an analytical tool in crystallography (Andrade 1943). A broad beam of x-rays is deflected by a crystal, and the many deflected waves combine to produce a pattern on a photographic plate. The power of x-rays lay precisely in their atomic resolution: they yielded information on the average distances between atoms in the crystal lattice.

Many similar techniques were developed to explore surfaces, especially with the growth of the semiconductor industry in the 1950s and '60s (Duke 1984). Scientists used light, electrons, or ions of all kinds of wavelengths or energies, shooting at all kinds of angles at the surface, sometimes measuring the particles transmitted through the target, sometimes those reflected back. Knowledge of the structure of semiconductor surfaces was obviously of tremendous financial importance, and so this armory of techniques became large and very sophisticated. The same techniques were used to examine metallic surfaces, for which there was also tremendous industrial interest.

4. The Novelty of the Scanning Tunneling Microscope

So, by the early 1980s, there was a large, if diffuse, social grouping of surface scientists, united by an understanding of, and a commitment to, an array of techniques yielding information about surfaces, often with atomic resolution, but always averaged over many atoms. In the following years, Binnig and Rohrer also worked to explain just what constituted the novelty of their new instrument. In the abstract of one paper (Binnig & Rohrer 1982) they referred to "unprecedented resolution in real space on an atomic scale" (real space in contrast to the conceptual "reciprocal space" used with diffraction techniques). They went on to explain: "The usual experimental methods to investigate surface structures (e.g. LEED, atom diffraction, ion channeling) are indirect in the sense that 'test models' are used to calculate the scattered intensity profile which is then compared with the one measured. In addition, these methods usually require periodic surface structures. The STM, on the other hand, gives 3d pictures of surface structures direct in real space" (Binnig & Rohrer 1982, p. 730).

Binnig and Rohrer not only advertised their new instrument to a busy but potentially interested audience, they also had to convince them that they were credible. Some scientists directly accused them of fraud and some reviewers rejected their papers.
A knee-jerk reaction of many scientists was that the resolution of an individual atom was impossible, due to the uncertainty principle, a fundamental tenet of quantum mechanics. The fact that the quantum mechanical effect of tunneling was centrally involved will have given scientists an immediate association with quantum mechanics and its somewhat different laws at atomic length scales. The uncertainty principle may be explained in the following way. If one were to determine the position of an individual atom, one could send out light (a photon) which, upon impinging on the atom, would change direction. The deflection of the photon would yield information about the atom's position, but unfortunately the deflection of the photon also entails a slight movement of the atom. Thus, some uncertainty will always remain about such issues as the position of individual atoms. Most scientists learning quantum physics encounter the uncertainty principle through examples such as the one just given. Nowadays STM users learn that such examples apply only to free atoms, and that the uncertainty principle does not pose the same obstacle for atoms embedded in a solid: while the photon might nudge the atom, the neighboring atoms will push it back into place. But in the early 1980s the audience will have consisted of many busy scientists whose knee-jerk reaction on hearing of the resolution of individual atoms was to dismiss it. Some scientists will also have had much invested in the existing techniques and been reluctant to accept a new one that might render their expertise obsolete. Surface scientists and crystallographers were, generally speaking, proud of their facility in thinking in terms of both real and reciprocal space.

And so Binnig and Rohrer needed to build up their own credibility. For instance, they needed a convincing theory based on quantum mechanics explaining the tunneling process. According to this theory (developed in 1983 and 1984), the image is not just a question of "feeling" the topography of the surface but rather a result of the overlap of electron orbitals of the tip and sample atoms with the greatest proximity (Tersoff & Hamann 1983, García et al 1983, García et al 1984). The bottom line is that STM measurements require interpretation according to a theoretical model, and it is not immediately obvious which model is the most appropriate. On top of all this, it is difficult to get the STM to work properly: proficient users will tell you that it might measure junk for hours and then suddenly yield sensible information. (This phenomenon, so the explanation goes, is due to the chance placement of an atom on the tip that gives it the required sharpness. That is to say, when scanning across the surface very closely, a surface atom might jump from the surface to the tip and sit in such a way as to jut out and give the tip the desired sharpness.)

All of this means that other scientists had plenty of reason to dismiss Binnig and Rohrer's results, and one might expect that those with a career invested in existing techniques would feel threatened by an instrument promising markedly better performance. Many surface scientists thus had both the motivation and the arguments to reject the STM. The politic reaction of Binnig and Rohrer was of the kind: 'okay guys, it's not that novel, really – relax and give us a break' (Binnig & Rohrer 2001; the accusations of fraud are also discussed in Binnig 1989).
They wrote (cf. Figure 2): "we understand the STM as a complement to present microscopy rather than a competitor. For many applications, the STM is best used in combination with another microscope" (Binnig & Rohrer 1982, p. 734). And indeed everyone used the STM in conjunction with another microscope. The proficient new STM user was able to distinguish obvious noise from a proper measurement by comparing the result with that obtained from another tool. The evidence yielded by the STM is mediated through quantum theoretical understanding and a profound pre-existing understanding of surfaces. (For the importance of the pre-existing understanding of surfaces, cf. Steensgaard 2001.) And importantly, the novelty of the STM was negotiated: at times it was emphasized, at times downplayed. The novelty sometimes focused on the atomic resolution, but it didn't have to. For example, the AFM, the sibling of the STM and much more widely used, does not yield atomic resolution. The utility of the instrument doesn't require atomic resolution. But symbolically, atomic resolution mattered greatly – comparing it to the holy grail is not too much of a hyperbole, after all.

Figure 2: The resolution of various microscopic techniques in 1982. The shaded area is the resolution that the STM was capable of; SEM refers to the Scanning Electron Microscope, FIM to the Field Ion Microscope, and so on (Binnig & Rohrer 1982, p. 734). Courtesy of Birkhäuser Publishers Ltd.

As always, a new technique becomes credible only when replicable (Collins 1985, esp. chapter 2, "The Idea of Replication", pp. 29-49), and it took years for an STM to be built successfully outside IBM Zurich. Other IBM labs came first, and by 1985 there was a small community of STM users. At this point, Scientific American picked up the story. Binnig and Rohrer wrote the article jointly with the staff of Scientific American. The staff of course knew how to address a broader audience than just the surface science community, and so the language shifted importantly. The new kind of microscope enables one to "see" surfaces "atom by atom". The article also advertised the instrument's versatility: it "may extend to investigators in the fields of physics, chemistry, and biology".

The next year, 1986, was the STM's breakthrough year. Binnig and Rohrer received the Nobel Prize, and Eric Drexler published his influential Engines of Creation, which also popularized the notion of nanotechnology. Drexler does refer to the STM, but not centrally. The manipulation of individual atoms is pretty much taken for granted, and he focuses much more on the implications of that purported ability, thus shifting the discourse towards artificial life and the creation of alternative life forms.

5. The Hyping of the Scanning Tunneling Microscope

The story of the scanning tunneling microscope and its new siblings (collectively called scanning probe microscopes, or SPMs) after 1986 primarily went off in the direction of immediate utility, which is discussed by Cyrus Mody (in this volume). One might posit a continued disconnect between the actual work done with SPMs and the LEGO™-style construction of life-like molecular systems at the foundation of the Drexlerian vision.
Even the historian of science Jed Buchwald has contributed to this disconnect, by rendering an illustration of "Zippenfeld's amazing atomic etcher", purportedly for touching up the family's greeting cards (Buchwald 2000, p. 205). The illustration is unreferenced, and Buchwald in fact made it up himself (Buchwald 2003). One event has enhanced this disconnect more than others: IBM employees' media stunt of writing "IBM" with individual atoms (Eigler & Schweizer 1990). They used "the STM at low temperatures (4K) to position individual xenon atoms on a single-crystal nickel surface with atomic precision. This capacity has allowed us to fabricate rudimentary structures of our own design, atom by atom ... the possibilities for perhaps the ultimate in device miniaturization is evident." The paper made it straight to the front page of the issue of Nature in which it was published. The reason for its media success was of course its relevance to the Drexlerian promise/hype. It is of methodological advantage to talk about promise/hype, to retain a Janus-faced ambiguity and not decide in advance whether nanotechnology will succeed or fail (Latour 1987, p. 4). The nature of the promise requires no further explication at this point, whereas the nature of the hype does.

First of all, the IBM experiment worked only at 4K, an extremely low temperature, and in high vacuum. One of Drexler's points was that we would only be able to assemble energetically stable molecules, and IBM's surface with patterns made of xenon atoms is not energetically stable except at these low temperatures. Furthermore, Eigler et al. were able to move atoms laterally on a surface, which is rather different from assembling a three-dimensional molecule – DNA, RNA, and proteins are of course not flat. In a word, there is a tremendous disconnect between moving xenon atoms on a surface at 4K, if that is what Eigler actually does, and building large complex bio-molecules LEGO™-style. Xenon, after all, is an inert gas, meaning that it prefers not to bond chemically. Nudging along an atom that skates on the surface without any propensity to engage with the substrate is comparatively easy; picking up a chemically active atom and placing it somewhere in a huge chemically active three-dimensional molecule is completely different.

Don Eigler has continued to popularize this experiment. Visitors may experience the set-up at IBM's Almaden Research Center in San Jose, California, and a virtual art gallery of STM renditions of xenon atoms on a nickel surface has come into existence. In 1996, Charles Siebert "flew across the country to move an atom" and to write about it in the New York Times. That he had to fly from New York City to San Francisco indicates that we are not talking about an experiment that has proliferated greatly. Others have written or drawn other words and images with a similar set-up, but this technique is not being worked on for industrial application. Siebert ignored, and perhaps didn't even understand, the mediated nature of his movement of single atoms. All he did was to "nudge around a single atom of the element xenon, to pick it up and put it back down, to will that atom where I wanted". There is no talk of a hand using a mouse in coordination with an image on a computer screen, and still less talk of what goes into the making of that image (Siebert 1996). Eigler's program for moving the atom with the mouse even plays a chirpy sound when the atom falls into place, just as LEGO™ bricks click when slotted together (Mody 2003).
Even more than the Scientific American article of 1986, articles like Siebert's elide the disconnect. Obviously, promise/hype sells better than pedantic arguments. But it is precisely the elision of the pedantic argument that is of interest here: the elision of the differences between atomic resolution, atomic manipulability, and the ability to assemble self-replicating molecular systems LEGO™-style out of individual atoms. The word nanotechnology focuses our attention on the nanoscale, the scale of atoms, and this term covers a multitude of sins. Nano is simultaneously scanning probe microscopy, Eiglerian atom nudging, and Drexlerian hype.
